Does provable absence of barren plateaus imply classical simulability? Or, why we need to rethink variational quantum computing

Cerezo, M., Larocca, Martin, García-Martín, Diego, Diaz, N. L., Braccia, Paolo, Fontana, Enrico, Rudolph, Manuel S., Bermejo, Pablo, Ijaz, Aroosa, Thanasilp, Supanut, Anschuetz, Eric R., Holmes, Zoë

arXiv.org Machine Learning

A large amount of effort has recently been put into understanding the barren plateau phenomenon. In this perspective article, we face the increasingly loud elephant in the room and ask a question that has been hinted at by many but not explicitly addressed: Can the structure that allows one to avoid barren plateaus also be leveraged to efficiently simulate the loss classically? We present strong evidence that commonly used models with provable absence of barren plateaus are also classically simulable, provided that one can collect some classical data from quantum devices during an initial data acquisition phase. This follows from the observation that barren plateaus result from a curse of dimensionality, and that current approaches for avoiding them end up encoding the problem into some small, classically simulable subspace. This casts serious doubt on the non-classicality of the information processing capabilities of parametrized quantum circuits for barren plateau-free landscapes, and on the possibility of superpolynomial advantages from running them on quantum hardware. We end by discussing caveats in our arguments and the role of smart initializations, and by highlighting new opportunities that our perspective raises.
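The curse-of-dimensionality intuition behind barren plateaus can be illustrated without any quantum machinery: the expected squared overlap of two random unit vectors in a 2^n-dimensional state space is exactly 1/2^n, mirroring how loss values and gradients concentrate exponentially with qubit count. A minimal NumPy toy (illustrative only, not from the paper; the function name is ours):

```python
import numpy as np

def mean_sq_overlap(n_qubits, trials=2000, seed=0):
    """Estimate E[|<u, v>|^2] for random unit vectors u, v in a
    2**n_qubits-dimensional complex space; the exact value is 1/2**n_qubits."""
    rng = np.random.default_rng(seed)
    d = 2 ** n_qubits
    vals = []
    for _ in range(trials):
        # Normalized complex Gaussian vectors are uniform on the unit sphere.
        u = rng.normal(size=d) + 1j * rng.normal(size=d)
        v = rng.normal(size=d) + 1j * rng.normal(size=d)
        u /= np.linalg.norm(u)
        v /= np.linalg.norm(v)
        vals.append(abs(np.vdot(u, v)) ** 2)
    return float(np.mean(vals))

# Overlaps shrink as 1/2**n: every extra qubit halves the typical signal,
# which is the concentration underlying barren plateaus.
print(mean_sq_overlap(2), mean_sq_overlap(4), mean_sq_overlap(6))
```

Avoiding this concentration means restricting to a small subspace of the exponentially large state space, which is precisely the structure the article argues is also classically exploitable.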


Lie Neurons: Adjoint-Equivariant Neural Networks for Semisimple Lie Algebras

Lin, Tzu-Yuan, Zhu, Minghan, Ghaffari, Maani

arXiv.org Artificial Intelligence

This paper proposes an adjoint-equivariant neural network that takes Lie algebra data as input. Various types of equivariant neural networks have been proposed in the literature, which treat the input data as elements of a vector space carrying certain types of transformations. In comparison, we aim to process inputs that are themselves transformations between vector spaces. A change of basis acts on such transformations by conjugation, which induces the adjoint-equivariance relationship that our model is designed to capture. Leveraging the invariance property of the Killing form, the proposed network is a general framework that works for arbitrary semisimple Lie algebras. Our network possesses a simple structure that can be viewed as a Lie algebraic generalization of a multi-layer perceptron (MLP). This work extends the application of equivariant feature learning: respecting the symmetry in data is essential for deep learning models to understand the underlying objects.
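The invariance the abstract leans on can be checked numerically in a few lines. The sketch below (illustrative only, not the authors' implementation; all names are ours) uses so(3), where the Killing form of so(n) reduces to (n-2)·tr(XY), and verifies that it is unchanged under the adjoint action X ↦ R X Rᵀ:

```python
import numpy as np

def hat(w):
    """so(3) element: the skew-symmetric matrix of a 3-vector."""
    x, y, z = w
    return np.array([[0.0,  -z,   y],
                     [  z, 0.0,  -x],
                     [ -y,   x, 0.0]])

def rodrigues(axis, theta):
    """Rotation matrix about a unit axis (Rodrigues' formula)."""
    K = hat(axis / np.linalg.norm(axis))
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def killing_so3(X, Y):
    """Killing form on so(3); for so(n) it equals (n-2)*tr(XY)."""
    return np.trace(X @ Y)

# The adjoint action Ad_R: X -> R X R^T is a conjugation, and the trace
# is invariant under conjugation, so the Killing form is adjoint-invariant.
X, Y = hat([1.0, 2.0, 3.0]), hat([-0.5, 0.4, 1.2])
R = rodrigues(np.array([0.3, -1.0, 0.7]), 1.1)
before = killing_so3(X, Y)
after = killing_so3(R @ X @ R.T, R @ Y @ R.T)
assert np.isclose(before, after)
```

Features built from such invariant pairings transform predictably under conjugation, which is the property an adjoint-equivariant layer must preserve.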